# Efficient Computation
## Falcon H1 1.5B Instruct

**tiiuae** · License: Other · Tags: Large Language Model, Transformers · Downloads: 1,022 · Likes: 4

Falcon-H1 is an efficient hybrid-architecture language model developed by TII. It combines the strengths of the Transformer and Mamba architectures and supports English and multilingual tasks.
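Since the card tags Falcon-H1 with the Transformers library, a minimal chat-generation sketch might look like the following. The repo id `tiiuae/Falcon-H1-1.5B-Instruct` is inferred from the card above and should be checked against the actual hub listing.

```python
# Hedged sketch: chat generation with Falcon-H1-1.5B-Instruct via Hugging Face
# Transformers. The repo id below is inferred from the card and may differ.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/Falcon-H1-1.5B-Instruct"  # assumed hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [{"role": "user", "content": "Explain hybrid Transformer-Mamba models in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```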
## Ring Lite Linear Preview

**inclusionAI** · License: MIT · Tags: Large Language Model, Supports Multiple Languages · Downloads: 25 · Likes: 8

Ring Lite Linear Preview is a hybrid linear sparse large language model open-sourced by InclusionAI, with 17.1B total parameters and 3.0B activated parameters. It performs long-text reasoning with a hybrid linear attention mechanism, achieving near-linear computational complexity and near-constant space complexity during inference.
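To make the complexity claim concrete: in linear attention, each decoding step updates a fixed-size state instead of appending to a growing key/value cache, so per-step cost and memory do not depend on context length. The sketch below follows the standard kernelized linear-attention recurrence; it is a generic illustration, not Ring Lite's actual kernel, and the feature map and dimensions are assumptions.

```python
# Generic linear-attention decode step, to illustrate near-constant inference
# memory. NOT Ring Lite's actual implementation.
import torch

def phi(x):
    # A common positive feature map choice; an assumption for this sketch.
    return torch.nn.functional.elu(x) + 1

def decode_step(S, z, q, k, v):
    """S: (d, d) running sum of phi(k) outer v; z: (d,) running normalizer."""
    S = S + torch.outer(phi(k), v)      # O(d^2) update, independent of length
    z = z + phi(k)
    out = (phi(q) @ S) / (phi(q) @ z)   # softmax-free attention readout
    return S, z, out

d = 64
S, z = torch.zeros(d, d), torch.zeros(d)
for _ in range(10_000):                 # state size never grows with context
    q, k, v = torch.randn(3, d)
    S, z, out = decode_step(S, z, q, k, v)
```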
## Kanana Nano 2.1b Embedding

**kakaocorp** · Tags: Large Language Model, Transformers, Supports Multiple Languages · Downloads: 7,722 · Likes: 20

Kanana is a bilingual (Korean/English) language model series developed by Kakao. It excels at Korean tasks while remaining competitive on English tasks, at a significantly lower computational cost than models of similar scale.
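For the embedding variant, a generic Transformers-based sketch is below. The repo id `kakaocorp/kanana-nano-2.1b-embedding` and the mean-pooling scheme are assumptions; the actual model card may specify its own pooling or loading options.

```python
# Hedged sketch: sentence embeddings with mean pooling over the last hidden
# state. Repo id and pooling are assumptions; consult the model card.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "kakaocorp/kanana-nano-2.1b-embedding"  # assumed hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

texts = ["카카오는 AI 모델을 공개했다.", "Kakao released an AI model."]
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    hidden = model(**batch).last_hidden_state          # (batch, seq, dim)

mask = batch["attention_mask"].unsqueeze(-1).float()   # ignore padding tokens
emb = (hidden * mask).sum(dim=1) / mask.sum(dim=1)     # mean pooling
emb = torch.nn.functional.normalize(emb, dim=-1)
print(emb[0] @ emb[1])                                 # cosine similarity
```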